Search results for "information theory"

Showing 10 of 51 documents

Extropy: Complementary Dual of Entropy

2015

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon's entropy function has a complementary dual function which we call "extropy." The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of distinct measures for any quantity that is not merely an event indicator. As with entropy, the maximum extropy distribution is also the uniform distribution, and both measures are invariant with respect to permutations of their mass functions. However, they behave quite differently in their assessments…
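The duality described in this abstract can be checked numerically. A minimal sketch, assuming the definitions used in the extropy literature, H(p) = -Σ p_i log p_i and J(p) = -Σ (1 - p_i) log(1 - p_i); the equality for binary distributions and the bifurcation beyond them follow directly:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) log(1 - p_i), the complementary dual."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

# For a binary distribution the two measures coincide ...
p2 = (0.3, 0.7)
print(entropy(p2), extropy(p2))   # identical values

# ... but they bifurcate for any distribution over more than two outcomes.
p3 = (0.2, 0.3, 0.5)
print(entropy(p3), extropy(p3))   # distinct values
```

Both measures are maximized by the uniform distribution and are invariant under permutations of the mass function, as the abstract notes.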

Keywords: Bregman divergence; FOS: Computer and information sciences; Statistics and Probability; Settore MAT/06 - Probabilita' E Statistica Matematica; Kullback–Leibler divergence; Computer Science - Information Theory; General Mathematics; FOS: Physical sciences; Binary number; Mathematics - Statistics Theory; Statistics Theory (math.ST); proper scoring rules; Gini index of heterogeneity; Differential entropy; Binary entropy function; FOS: Mathematics; Entropy (information theory); Statistical physics; Dual function; Axiom; Mathematics; Differential and relative entropy/extropy; Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Information Theory (cs.IT); Probability (math.PR); repeat rate; Physics - Data Analysis, Statistics and Probability; duality; Statistics, Probability and Uncertainty; Settore SECS-S/01 - Statistica; Mathematics - Probability; Data Analysis, Statistics and Probability (physics.data-an); Statistical Science

Shallow Reductionism and the Problem of Complexity in Psychology

2008

In his recent book The Mind Doesn't Work That Way, Fodor argues that computational modeling of global cognitive processes, such as abductive everyday reasoning, has not been successful. In this article the problem is analyzed in the framework of algorithmic information theory. It is argued that the failed approaches are characterized by shallow reductionism, which is rejected in favor of deep reductionism and nonreductionism.

Keywords: Cognitive science; Reductionism; Algorithmic information theory; History and Philosophy of Science; Connectionism; Philosophy; Cognition; General Psychology; Epistemology; Theory & Psychology

Quantifying, characterizing, and controlling information flow in ultracold atomic gases

2011

We study quantum information flow in a model comprising an impurity qubit immersed in a Bose-Einstein condensed reservoir. We demonstrate how information flux between the qubit and the condensate can be manipulated by engineering the ultracold reservoir within experimentally realistic limits. We place a particular emphasis on non-Markovian dynamics, characterized by a reversed flow of information from the background gas to the qubit, and identify a controllable crossover between Markovian and non-Markovian dynamics in the parameter space of the model.

Keywords: Condensed Matter::Quantum Gases; Physics; Quantum Physics; Flux qubit; FOS: Physical sciences; Quantum simulator; One-way quantum computer; Atomic and Molecular Physics and Optics; Phase qubit; Open quantum system; Quantum Gases (cond-mat.quant-gas); Qubit; BECs; entanglement; quantum information theory; open quantum systems; Statistical physics; Quantum information; Atomic physics; Condensed Matter - Quantum Gases; Quantum Physics (quant-ph); Trapped ion quantum computer; Physical Review A

Multiscale Information Storage of Linear Long-Range Correlated Stochastic Processes

2019

Information storage, reflecting the capability of a dynamical system to keep predictable information during its evolution over time, is a key element of intrinsic distributed computation, useful for the description of the dynamical complexity of several physical and biological processes. Here we introduce a parametric approach which allows one to compute information storage across multiple timescales in stochastic processes displaying both short-term dynamics and long-range correlations (LRC). Our analysis is performed in the popular framework of multiscale entropy, whereby a time series is first "coarse grained" at the chosen timescale through low-pass filtering and downsampling, and then …
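The coarse-graining step this abstract describes is standard in the multiscale-entropy literature. A minimal sketch of that step alone, averaging non-overlapping windows (which combines low-pass filtering with downsampling); the paper's parametric treatment of long-range correlations is not reproduced here:

```python
import numpy as np

def coarse_grain(x, scale):
    """Coarse-grain a time series at a given timescale: average
    non-overlapping windows of length `scale` (a moving-average low-pass
    filter) and keep one value per window (downsampling)."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // scale) * scale          # drop the incomplete tail window
    return x[:n].reshape(-1, scale).mean(axis=1)

x = np.arange(12.0)                        # toy series 0, 1, ..., 11
y = coarse_grain(x, 3)                     # averages of [0,1,2], [3,4,5], ...
print(y)
```

An entropy measure is then computed on the coarse-grained series at each scale, which is what makes the analysis "multiscale".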

Keywords: Conditional entropy; FOS: Computer and information sciences; Computer science; Stochastic process; Dynamical system; 01 natural sciences; Measure (mathematics); 010305 fluids & plasmas; Methodology (stat.ME); Multiscale entropy; Information theory; Complexity; Autoregressive model; 0103 physical sciences; State space; 010306 general physics; Representation (mathematics); Algorithm; Statistics - Methodology; Parametric statistics

Entanglement generation between two spin-s magnetic impurities in a solid via electron scattering

2009

We present a scheme for generating entanglement between two magnetic impurities in a solid-state system via electron scattering. The scheme applies to impurities of arbitrary quantum spin number. We show that resonance conditions yield generation of a maximally entangled state of the impurities' spins, regardless of the value of the electron–impurity coupling constant and the impurity spin quantum number. The mechanism behind the scheme is explained in terms of resonance-induced selection rules.

Keywords: Coupling constant; Physics; Condensed matter physics; quantum information theory; transport in mesoscopic systems; Spin engineering; General Chemistry; Quantum entanglement; Condensed Matter::Mesoscopic Systems and Quantum Hall Effect; Condensed Matter Physics; Quantum number; Spin quantum number; Condensed Matter::Superconductivity; Qubit; Condensed Matter::Strongly Correlated Electrons; General Materials Science; Quantum information; Spin (physics)

Algorithmic Information Theory and Computational Complexity

2013

We present examples where theorems on complexity of computation are proved using methods in algorithmic information theory. The first example is a non-effective construction of a language for which the size of any deterministic finite automaton exceeds the size of a probabilistic finite automaton with a bounded error exponentially. The second example refers to frequency computation. Frequency computation was introduced by Rose and McNaughton in the early sixties and developed by Trakhtenbrot, Kinber, Degtev, Wechsung, Hinrichs and others. A transducer is a finite-state automaton with an input and an output. We consider the possibilities of probabilistic and frequency transducers and prove sever…

Keywords: Discrete mathematics; Average-case complexity; Algorithmic information theory; TheoryofComputation_COMPUTATIONBYABSTRACTDEVICES; Kolmogorov complexity; Descriptive complexity theory; Computational physics; Structural complexity theory; TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES; Deterministic finite automaton; Asymptotic computational complexity; Computer Science::Formal Languages and Automata Theory; Computational number theory; Mathematics

Combinatorial proofs of two theorems of Lutz and Stull

2021

Recently, Lutz and Stull used methods from algorithmic information theory to prove two new Marstrand-type projection theorems, concerning subsets of Euclidean space which are not assumed to be Borel, or even analytic. One of the theorems states that if $K \subset \mathbb{R}^{n}$ is any set with equal Hausdorff and packing dimensions, then $$ \dim_{\mathrm{H}} π_{e}(K) = \min\{\dim_{\mathrm{H}} K,1\} $$ for almost every $e \in S^{n - 1}$. Here $π_{e}$ stands for orthogonal projection to $\mathrm{span}(e)$. The primary purpose of this paper is to present proofs for Lutz and Stull's projection theorems which do not refer to information theoretic concepts. Instead, they will rely on combinatori…

Keywords: FOS: Computer and information sciences; 28A80 (primary), 28A78 (secondary); General Mathematics; combinatorics (kombinatoriikka); Combinatorial proof; Computational Complexity (cs.CC); 01 natural sciences; Combinatorics; Mathematics - Metric Geometry; Hausdorff and packing measures; 0103 physical sciences; Classical Analysis and ODEs (math.CA); FOS: Mathematics; 0101 mathematics; Mathematics; Algorithmic information theory; Lemma (mathematics); Euclidean space; Pigeonhole principle; 010102 general mathematics; Orthographic projection; Hausdorff space; Metric Geometry (math.MG); Projection (relational algebra); Computer Science - Computational Complexity; Mathematics - Classical Analysis and ODEs; fractals (fraktaalit); 010307 mathematical physics; measure theory (mittateoria)

Decoding algorithm for HL-codes and performance of the DHH-cryptosystem -- a candidate for post-quantum cryptography

2023

We give a decoding algorithm for a class of error-correcting codes that can be used in the DHH-cryptosystem, a McEliece-type candidate for post-quantum cryptography. Furthermore, we implement the encryption and decryption algorithms for this cryptosystem and investigate its performance.

Keywords: FOS: Computer and information sciences; Computer Science - Cryptography and Security; Computer Science - Information Theory; Information Theory (cs.IT); MSC: 81P94, 94A60, 94B35; Cryptography and Security (cs.CR)

End-to-end Optimized Image Compression

2016

We describe an image compression method, consisting of a nonlinear analysis transformation, a uniform quantizer, and a nonlinear synthesis transformation. The transforms are constructed in three successive stages of convolutional linear filters and nonlinear activation functions. Unlike most convolutional neural networks, the joint nonlinearity is chosen to implement a form of local gain control, inspired by those used to model biological neurons. Using a variant of stochastic gradient descent, we jointly optimize the entire model for rate-distortion performance over a database of training images, introducing a continuous proxy for the discontinuous loss function arising from the quantizer.…
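The "continuous proxy" for the quantizer mentioned at the end is, in this line of work, commonly realized by replacing hard rounding with additive uniform noise during training, so the loss stays differentiable. A minimal sketch of that substitution; function names are illustrative, not taken from the paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(y):
    """Hard quantizer used at test time: round to the nearest integer.
    Its derivative is zero almost everywhere, so it cannot be trained
    through directly."""
    return np.round(y)

def quantize_train(y):
    """Continuous training proxy: additive uniform noise on [-0.5, 0.5),
    which matches the marginal error statistics of rounding while
    keeping the mapping differentiable in y."""
    return y + rng.uniform(-0.5, 0.5, size=np.shape(y))

y = np.array([0.2, 1.7, -0.6])
print(quantize(y))        # rounds to 0, 2, -1
print(quantize_train(y))  # y perturbed by at most 0.5 per element
```

With this proxy in place, the whole analysis–quantize–synthesis pipeline can be optimized end to end with stochastic gradient descent, as the abstract describes.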

Keywords: FOS: Computer and information sciences; Computer Science - Information Theory; Computer Vision and Pattern Recognition (cs.CV); Information Theory (cs.IT); Computer Science - Computer Vision and Pattern Recognition; ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION; Data_CODINGANDINFORMATIONTHEORY

Nash codes for noisy channels

2012

This paper studies the stability of communication protocols that deal with transmission errors. We consider a coordination game between an informed sender and an uninformed decision maker, the receiver, who communicate over a noisy channel. The sender's strategy, called a code, maps states of nature to signals. The receiver's best response is to decode the received channel output as the state with highest expected receiver payoff. Given this decoding, an equilibrium or "Nash code" results if the sender encodes every state as prescribed. We show two theorems that give sufficient conditions for Nash codes. First, a receiver-optimal code defines a Nash code. A second, more surprising observati…
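The receiver's best response described above is an argmax over states, weighting each state by its prior probability and the channel's chance of producing the observed output from that state's codeword. A minimal sketch under toy assumptions (two states, a binary symmetric channel, and a matching payoff; all names are hypothetical):

```python
def decode(y, states, prior, code, channel, payoff):
    """Receiver's best response in the sender-receiver game: decode
    channel output y as the state with the highest expected receiver
    payoff, given the prior, the sender's code, and the channel."""
    def expected_payoff(action):
        return sum(prior[s] * channel(code[s], y) * payoff(s, action)
                   for s in states)
    return max(states, key=expected_payoff)

states = [0, 1]
prior = {0: 0.5, 1: 0.5}
code = {0: 0, 1: 1}                      # sender encodes each state as itself

def channel(signal, y, eps=0.1):         # binary symmetric channel
    return 1 - eps if signal == y else eps

def payoff(state, action):               # receiver wants to match the state
    return 1.0 if state == action else 0.0

print(decode(0, states, prior, code, channel, payoff))  # -> 0
print(decode(1, states, prior, code, channel, payoff))  # -> 1
```

In the paper's terms, the code is a Nash code if, given this decoding rule, the sender has no incentive to encode any state differently.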

Keywords: FOS: Computer and information sciences; Computer Science::Computer Science and Game Theory; Theoretical computer science; Computer science; Information Theory (cs.IT); Computer Science - Information Theory; Stochastic game; jel:C72; jel:D82; Stability (learning theory); Data_CODINGANDINFORMATIONTHEORY; Management Science and Operations Research; sender-receiver game; communication; noisy channel; 91A28; Computer Science Applications; Computer Science - Computer Science and Game Theory; Best response; Code (cryptography); Coordination game; QA Mathematics; Decoding methods; Communication channel; Computer Science and Game Theory (cs.GT); Computer Science::Information Theory